
    LNCS

    We present two algorithmic approaches for synthesizing linear hybrid automata from experimental data. Unlike previous approaches, our algorithms work without a template and generate an automaton with nondeterministic guards and invariants, and with an arbitrary number and topology of modes. They thus construct a succinct model from the data and provide formal guarantees: (1) the generated automaton can reproduce the data up to a specified tolerance, and (2) given the first guarantee, the automaton is tight. Our first approach encodes the synthesis problem as a logical formula in the theory of linear arithmetic, which can then be solved by an SMT solver. This approach minimizes the number of modes in the resulting model but is only feasible for small data sets. To address scalability, we propose a second approach that does not insist on finding a minimal model. This algorithm constructs an initial automaton and then iteratively extends it as new data are processed, making it well suited for online and synthesis-in-the-loop applications. Its core is a membership query that checks whether, within the specified tolerance, a given data set can result from the execution of a given automaton. We solve this membership problem for linear hybrid automata by repeated reachability computations. We demonstrate the effectiveness of the algorithm on synthetic data sets and on cardiac-cell measurements.
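    The essence of such a membership query can be illustrated with a drastically simplified sketch. The toy check below (hypothetical names, one-dimensional data, constant-slope modes) merely asks whether every observed slope lies within the tolerance of some mode's derivative; the actual algorithm instead performs repeated reachability computations over a hybrid automaton.

```python
def membership(data, modes, eps):
    """Return True if every observed slope of the sampled signal `data`
    (a list of (time, value) pairs) lies within `eps` of the constant
    derivative of some mode. A toy stand-in for the reachability-based
    membership query described in the abstract."""
    for (t0, x0), (t1, x1) in zip(data, data[1:]):
        slope = (x1 - x0) / (t1 - t0)
        if not any(abs(slope - m) <= eps for m in modes):
            return False
    return True

# a signal with slope roughly 2 is explainable by a single mode x' = 2
samples = [(0.0, 0.0), (1.0, 2.1), (2.0, 4.0)]
print(membership(samples, [2.0], 0.2))   # True: slopes 2.1 and 1.9
print(membership(samples, [2.0], 0.05))  # False: slope 2.1 deviates by 0.1
```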

    Fast approximation of centrality and distances in hyperbolic graphs

    We show that the eccentricities (and thus the centrality indices) of all vertices of a δ-hyperbolic graph G = (V, E) can be computed in linear time with an additive one-sided error of at most cδ, i.e., after a linear-time preprocessing, for every vertex v of G one can compute in O(1) time an estimate ê(v) of its eccentricity ecc_G(v) such that ecc_G(v) ≤ ê(v) ≤ ecc_G(v) + cδ for a small constant c. We prove that every δ-hyperbolic graph G has a shortest-path tree T, constructible in linear time, such that for every vertex v of G, ecc_G(v) ≤ ecc_T(v) ≤ ecc_G(v) + cδ. These results are based on an interesting monotonicity property of the eccentricity function of hyperbolic graphs: the closer a vertex is to the center of G, the smaller its eccentricity is. We also show that the distance matrix of G can be computed in O(|V|² log² |V|) time with an additive one-sided error of at most c'δ, where c' < c is a small constant. Recent empirical studies show that many real-world graphs (including Internet application networks, web networks, collaboration networks, social networks, biological networks, and others) have small hyperbolicity. We therefore analyze the performance of our algorithms for approximating centrality and the distance matrix on a number of real-world networks. Our experimental results show that the obtained estimates are even better than the theoretical bounds.
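    To give a flavour of why a centre-like vertex helps, here is a small illustrative sketch (hypothetical names, not the authors' linear-time algorithm): after a few BFS sweeps one picks a centre proxy c and estimates ecc(v) by d(v, c) + ecc(c), which by the triangle inequality is always an upper bound and, on graphs of small hyperbolicity, overshoots by only O(δ).

```python
from collections import deque

def bfs_dist(adj, s):
    # single-source BFS distances in an unweighted graph
    dist = {s: 0}
    q = deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def estimate_eccentricities(adj):
    nodes = list(adj)
    # double sweep: find a far-apart pair (a, b)
    a = max(bfs_dist(adj, nodes[0]).items(), key=lambda kv: kv[1])[0]
    da = bfs_dist(adj, a)
    b = max(da.items(), key=lambda kv: kv[1])[0]
    db = bfs_dist(adj, b)
    # centre proxy: vertex minimising its distance to the farther of a, b
    c = min(nodes, key=lambda v: max(da[v], db[v]))
    dc = bfs_dist(adj, c)
    ecc_c = max(dc.values())
    # d(v, c) + ecc(c) >= ecc(v) by the triangle inequality;
    # on hyperbolic graphs the overshoot stays bounded by O(delta)
    return {v: dc[v] + ecc_c for v in nodes}

# on a path (a tree, hence delta = 0) the estimates are exact
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(estimate_eccentricities(path))  # {0: 4, 1: 3, 2: 2, 3: 3, 4: 4}
```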

    Efficient and exact sampling of simple graphs with given arbitrary degree sequence

    Uniform sampling from graphical realizations of a given degree sequence is a fundamental component in simulation-based measurements of network observables, with applications ranging from epidemics, through social networks, to Internet modeling. Existing graph sampling methods are either link-swap based (Markov-chain Monte Carlo algorithms) or stub-matching based (the Configuration Model). Both types are ill-controlled, with typically unknown mixing times for link-swap methods and uncontrolled rejections for the Configuration Model. Here we propose an efficient, polynomial-time algorithm that generates statistically independent graph samples with a given, arbitrary degree sequence. The algorithm provides a weight associated with each sample, allowing the observable to be measured either uniformly over the graph ensemble or, alternatively, with a desired distribution. Unlike other algorithms, this method always produces a sample, without back-tracking or rejections. Using reasoning based on the central limit theorem, we argue that for large N, and for degree sequences admitting many realizations, the sample weights are expected to have a lognormal distribution. As examples, we apply our algorithm to generate networks with degree sequences drawn from power-law and binomial distributions.
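    A much-simplified sketch of the sequential, rejection-free idea (hypothetical code, unweighted; the published algorithm additionally accounts exactly for already-placed edges and computes the sample weights): place edges one at a time and accept only those that keep the residual degree sequence graphical, as tested by the Erdős–Gallai condition.

```python
import random

def is_graphical(seq):
    # Erdős–Gallai: a degree sequence is realizable by a simple graph
    # iff its sum is even and every k-prefix inequality holds
    seq = sorted(seq, reverse=True)
    if sum(seq) % 2:
        return False
    for k in range(1, len(seq) + 1):
        if sum(seq[:k]) > k * (k - 1) + sum(min(d, k) for d in seq[k:]):
            return False
    return True

def sample_graph(degrees):
    n = len(degrees)
    residual = list(degrees)
    edges = set()
    while max(residual) > 0:
        i = max(range(n), key=lambda v: residual[v])  # most stubs left
        cands = [j for j in range(n) if j != i and residual[j] > 0
                 and (min(i, j), max(i, j)) not in edges]
        random.shuffle(cands)
        for j in cands:
            if residual[i] == 0:
                break
            residual[i] -= 1
            residual[j] -= 1
            if is_graphical(residual):  # keep only graphicality-preserving edges
                edges.add((min(i, j), max(i, j)))
            else:
                residual[i] += 1
                residual[j] += 1
    return edges

random.seed(0)
g = sample_graph([2, 2, 2, 2])  # a simple graph in which every node has degree 2
```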

    The role of clathrin in post-Golgi trafficking in Toxoplasma gondii

    Apicomplexan parasites are single eukaryotic cells with a highly polarised secretory system that contains unique secretory organelles (micronemes and rhoptries) required for host cell invasion. In contrast, the role of the endosomal system is poorly understood in these parasites. With many typical endocytic factors missing, we speculated that endocytosis depends exclusively on a clathrin-mediated mechanism. Intriguingly, in Toxoplasma gondii we were only able to observe the endogenous clathrin heavy chain 1 (CHC1) at the Golgi, but not at the parasite surface. For the functional characterisation of Toxoplasma gondii CHC1, we generated parasite mutants conditionally expressing the dominant-negative clathrin Hub fragment and demonstrate that CHC1 is essential for vesicle formation at the trans-Golgi network. Consequently, the functional ablation of CHC1 results in Golgi aberrations and a block in the biogenesis of the unique secretory microneme and rhoptry organelles and of the pellicle. However, we found no morphological evidence for clathrin mediating endocytosis in these parasites and speculate that they remodelled their vesicular trafficking system to adapt to an intracellular lifestyle.

    The Incremental Cooperative Design of Preventive Healthcare Networks

    This document is the Accepted Manuscript version of the following article: Soheil Davari, 'The incremental cooperative design of preventive healthcare networks', Annals of Operations Research, first published online 27 June 2017. Under embargo. Embargo end date: 27 June 2018. The final publication is available at Springer via http://dx.doi.org/10.1007/s10479-017-2569-1.

    In the Preventive Healthcare Network Design Problem (PHNDP), one seeks to locate facilities so as to maximise the uptake of services subject to constraints such as congestion considerations. We introduce the incremental and cooperative version of the problem (IC-PHNDP for short), in which facilities are added to the network incrementally (one at a time), each contributing to the service levels. We first develop a general non-linear model of this problem and then present a method to linearise it. As the problem is combinatorial in nature, an efficient Variable Neighbourhood Search (VNS) algorithm is proposed to solve it. To gain insight into the problem, computational studies were performed on randomly generated instances with different settings. The results clearly show that VNS performs well in solving IC-PHNDP, with errors of no more than 1.54%.
    Peer reviewed.
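    For readers unfamiliar with VNS, a generic skeleton looks like the sketch below (illustrative only, on a toy objective; not the paper's IC-PHNDP-specific implementation): shake the incumbent in the k-th neighbourhood, run a local descent, and either recentre on an improvement or widen the neighbourhood.

```python
import random

def local_search(x, cost, neigh, rng, tries=30):
    # simple descent: sample moves in the first neighbourhood, keep improvements
    improved = True
    while improved:
        improved = False
        for _ in range(tries):
            y = neigh(x, rng)
            if cost(y) < cost(x):
                x, improved = y, True
    return x

def vns(x0, cost, neighborhoods, max_shakes=200, seed=0):
    rng = random.Random(seed)
    x = local_search(x0, cost, neighborhoods[0], rng)
    best = cost(x)
    shakes = 0
    while shakes < max_shakes:
        k = 0
        while k < len(neighborhoods) and shakes < max_shakes:
            xp = neighborhoods[k](x, rng)                    # shaking in N_k(x)
            xp = local_search(xp, cost, neighborhoods[0], rng)
            shakes += 1
            if cost(xp) < best:
                x, best, k = xp, cost(xp), 0                 # recentre, restart at N_1
            else:
                k += 1                                       # widen the neighbourhood
    return x, best

# toy demo: minimise a sum of squares over an integer vector
def flip1(x, rng):
    i = rng.randrange(len(x))
    y = list(x)
    y[i] += rng.choice([-1, 1])
    return y

def flip2(x, rng):
    return flip1(flip1(x, rng), rng)

sol, val = vns([5, -7, 3], cost=lambda v: sum(t * t for t in v),
               neighborhoods=[flip1, flip2])  # reaches the optimum 0 here
```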

    Hybrid Meta-heuristics with VNS and Exact Methods: Application to Large Unconditional and Conditional Vertex p-Centre Problems

    Large-scale unconditional and conditional vertex p-centre problems are solved using two meta-heuristics. One is based on a three-stage approach, whereas the other relies on a guided multi-start principle. Both methods incorporate Variable Neighbourhood Search, exact methods, and aggregation techniques. The methods are assessed on the TSP dataset, which consists of up to 71,009 demand points with p varying from 5 to 100. To the best of our knowledge, these are the largest instances solved for unconditional and conditional vertex p-centre problems. The two proposed meta-heuristics yield competitive results for both classes of problems.
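    To make the objective concrete, a minimal sketch (hypothetical names; unrelated to the paper's VNS/exact hybrids except for the objective): the vertex p-centre problem seeks p centres minimising the distance from the worst-served demand point, and the classical Gonzalez farthest-point greedy gives a 2-approximation.

```python
def coverage_radius(centres, points, dist):
    # p-centre objective: the farthest demand point from its nearest centre
    return max(min(dist(p, c) for c in centres) for p in points)

def greedy_pcentre(points, p, dist):
    """Gonzalez-style farthest-point greedy: repeatedly open a centre at the
    demand point currently worst served. A classical 2-approximation, shown
    here only to illustrate the objective the meta-heuristics optimise."""
    centres = [points[0]]
    while len(centres) < p:
        centres.append(max(points, key=lambda q: min(dist(q, c) for c in centres)))
    return centres

# five demand points on a grid, Manhattan distances, p = 2
pts = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)]
d = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
cs = greedy_pcentre(pts, 2, d)
r = coverage_radius(cs, pts, d)
print(cs, r)  # [(0, 0), (10, 10)] 10
```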

    Involvement of global genome repair, transcription coupled repair, and chromatin remodeling in UV DNA damage response changes during development

    Nucleotide Excision Repair (NER), which removes a variety of helix-distorting lesions from DNA, is initiated by two distinct DNA damage-sensing mechanisms. Transcription Coupled Repair (TCR) removes damage from the active strand of transcribed genes and depends on the SWI/SNF family protein CSB. Global Genome Repair (GGR) removes damage present elsewhere in the genome and depends on damage recognition by the XPC/RAD23/Centrin2 complex. Currently, it is not well understood to what extent both pathways contribute to genome maintenance and cell survival in a developing organism exposed to UV light. Here, we show that eukaryotic NER, initiated by these two distinct subpathways, is well conserved in the nematode Caenorhabditis elegans. In C. elegans, the involvement of TCR and GGR in the UV-induced DNA damage response changes during development. In germ cells and early embryos, we find that GGR is the major pathway contributing to normal development and survival after UV irradiation, whereas in later developmental stages TCR is predominantly engaged. Furthermore, we identify four ISWI/Cohesin and four SWI/SNF family chromatin remodeling factors that are implicated in the UV damage response in a developmental-stage-dependent manner. These in vivo studies strongly suggest that the involvement of different repair pathways and chromatin remodeling proteins in UV-induced DNA repair depends on the developmental stage of the cells.